2023-09-13 15:15:29 · AIbase
Taotian Group Collaborates with Aicheng Technology to Open Source the Megatron-LLaMA Large Model Training Framework
The Megatron-LLaMA framework, jointly developed and open-sourced by Taotian Group and Aicheng Technology, aims to improve the training performance of large language models, reduce training costs, and remain compatible with the LLaMA community. On 32 GPUs, it achieves a 176% training speedup.